Simplifying Mixture Models through Function Approximation
Authors
Abstract
The finite mixture model is widely used in various statistical learning problems. However, the model obtained may contain a large number of components, making it inefficient in practical applications. In this paper, we propose to simplify the mixture model by first grouping similar components together and then performing local fitting through function approximation. By using the squared loss to measure the distance between mixture models, our algorithm naturally combines the two different tasks of component clustering and model simplification. The proposed method can be used to speed up various algorithms that use mixture models during training (e.g., Bayesian filtering, belief propagation) or testing (e.g., kernel density estimation, SVM testing). Encouraging results are observed in the experiments on density estimation, clustering-based image segmentation and simplification of SVM decision functions.
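The pipeline described in the abstract (group similar components, then locally refit each group) can be illustrated on a one-dimensional Gaussian mixture. The sketch below is illustrative only and is not the paper's algorithm: it assumes a 1-D mixture given by weights, means, and variances, uses k-means on the component means as a stand-in for the squared-loss-based grouping, and uses moment matching as a stand-in for the local function-approximation fit. The function simplify_gmm and all of its parameters are hypothetical names introduced here.

import numpy as np
from scipy.cluster.vq import kmeans2

def simplify_gmm(weights, means, variances, n_out, seed=0):
    # Group the original components by k-means on their means
    # (a stand-in for the paper's squared-loss-based clustering).
    _, labels = kmeans2(means.reshape(-1, 1), n_out, minit='++', seed=seed)
    new_w, new_mu, new_var = [], [], []
    for k in range(n_out):
        idx = np.flatnonzero(labels == k)
        if idx.size == 0:          # k-means may leave a cluster empty
            continue
        w = weights[idx]
        w_sum = w.sum()
        # Replace the group by a single moment-matched Gaussian
        # (a stand-in for the paper's local function-approximation fit).
        mu = np.dot(w, means[idx]) / w_sum
        var = np.dot(w, variances[idx] + means[idx] ** 2) / w_sum - mu ** 2
        new_w.append(w_sum)
        new_mu.append(mu)
        new_var.append(var)
    return np.array(new_w), np.array(new_mu), np.array(new_var)

# Toy usage: reduce a 10-component mixture to (at most) 3 components.
rng = np.random.default_rng(0)
w = rng.dirichlet(np.ones(10))
mu = rng.normal(0.0, 5.0, size=10)
var = rng.uniform(0.5, 2.0, size=10)
print(simplify_gmm(w, mu, var, n_out=3))

In the paper itself, the component grouping and the local refitting are driven by a single squared-loss objective between the original and simplified mixtures, rather than by the two separate heuristics used in this sketch.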
Similar Resources
A New High-order Takagi-Sugeno Fuzzy Model Based on Deformed Linear Models
Amongst possible choices for identifying complicated processes in prediction, simulation, and approximation applications, high-order Takagi-Sugeno (TS) fuzzy models are suitable tools. Although they can construct models with rather high complexity, they are not as interpretable as first-order TS fuzzy models. In this paper, we first propose to use Deformed Linear Models (DLMs) in consequence pa...
Verification of an Evolutionary-based Wavelet Neural Network Model for Nonlinear Function Approximation
Nonlinear function approximation is one of the most important tasks in system analysis and identification. Several models have been presented to achieve an accurate approximation of nonlinear mathematical functions. However, the majority of the models are specific to certain problems and systems. In this paper, an evolutionary-based wavelet neural network model is proposed for structure definiti...
Approximate Inference for Generic Likelihoods via Density-Preserving GMM Simplification
We consider recursive Bayesian filtering where the posterior is represented as a Gaussian mixture model (GMM), and the likelihood function as a sum of scaled Gaussians (SSG). In each iteration of filtering, the number of components increases. We propose an algorithm for simplifying a GMM into a reduced mixture model with fewer components, which is based on maximizing a variational lower bound o...
Simplifying Spline Models
We present a new approach for simplifying models composed of spline patches. Given an input model, the algorithm computes a new approximation of the model in terms of cubic triangular Bézier patches. It performs a series of geometric operations, consisting of patch merging and swapping diagonals, and makes use of patch connectivity information to generate C-LODs (curved levels-of-detail). Each ...
A TS Fuzzy Model Derived from a Typical Multi-Layer Perceptron
In this paper, we introduce a Takagi-Sugeno (TS) fuzzy model that is derived from a typical Multi-Layer Perceptron Neural Network (MLP NN). First, it is shown that the considered MLP NN can be interpreted as a variety of TS fuzzy model. It is discussed that the Membership Function (MF) used in such a TS fuzzy model, despite its flexible structure, has some major restrictions. After modify...
Journal:
Volume / Issue:
Pages: -
Publication date: 2006